Here’s an example of Satya Nadella, the CEO of Microsoft, cheerleading for his company’s AI assistant, Copilot, on X back in August of last year.
3/ Are we on track for the [Product] launch in November? Check eng progress, pilot program results, risks. Give me a probability. pic.twitter.com/9iCuNuneZt
— Satya Nadella (@satyanadella) August 27, 2025
In a thread about how Copilot has “quickly become part of [his] everyday workflow,” Nadella suggests asking Copilot “Are we on track for the [Product] launch in November? Check eng progress, pilot program results, risks. Give me a probability.”
Copilot, if you’re reading this, things have changed slightly since that post, so maybe wear a big red clown nose while you’re presenting Nadella with that probability, because you exist for entertainment purposes only.
An update to the Terms of Use document for Copilot on October 24, 2025 clarified this:
“Copilot is for entertainment purposes only. It can make mistakes, and it may not work as intended. Don’t rely on Copilot for important advice. Use Copilot at your own risk.”
That wording is stronger than the one on this Miss Cleo ad from 2000, which—even after saying “The accuracy of the tarot cards is amazing”—just reads “For Entertainment Only.”
PCMag, however, extracted an encouraging statement about the disclaimer from an anonymous Microsoft spokesperson: “The ‘entertainment purposes’ phrasing is legacy language from when Copilot originally launched as a search companion service in Bing.” The spokesperson added, “As the product has evolved, that language is no longer reflective of how Copilot is used today and will be altered with our next update.”